Stabilized Sequential Quadratic Programming for Optimization and a Stabilized Newton-type Method for Variational Problems without Constraint Qualifications
Abstract
The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve fast convergence despite possible degeneracy of the constraints of optimization problems, i.e., when the Lagrange multipliers associated with a solution are not unique. Superlinear convergence of sSQP had previously been established under the second-order sufficient condition for optimality (SOSC) and the Mangasarian-Fromovitz constraint qualification, or under the strong second-order sufficient condition for optimality (in that case, without constraint qualification assumptions). We prove a stronger superlinear convergence result than the above, assuming SOSC only. In addition, our analysis is carried out in the more general setting of variational problems, for which we introduce a natural extension of sSQP techniques. In the process, we also obtain a new error bound for Karush-Kuhn-Tucker systems for variational problems.
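To illustrate the idea behind stabilization, the following is a minimal numerical sketch (not the paper's exact algorithm) of sSQP-type iterations for an equality-constrained problem whose constraint is deliberately duplicated, so the constraint Jacobian is rank-deficient and the Lagrange multipliers are non-unique. The problem data, the residual-based choice of the stabilization parameter, and the helper names (`ssqp_step`, etc.) are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: sSQP-style iterations for
#   min  x1^2 + x2^2   s.t.  x1 + x2 - 1 = 0   (stated twice),
# so the Jacobian A has rank 1 and multipliers are non-unique.

def f_grad(x):
    return 2.0 * x                   # gradient of the objective

def h(x):
    r = x[0] + x[1] - 1.0
    return np.array([r, r])          # duplicated constraint: degenerate

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])           # constraint Jacobian (rank-deficient)
H = 2.0 * np.eye(2)                  # Hessian of the Lagrangian

def ssqp_step(x, lam, sigma):
    """Solve the stabilized KKT system
         [H    A^T] [d    ]   [-grad f(x)        ]
         [A  -s*I ] [lam+ ] = [-h(x) - s*lam     ]
       which is nonsingular for sigma > 0 even though A is rank-deficient."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T],
                  [A, -sigma * np.eye(m)]])
    rhs = np.concatenate([-f_grad(x), -h(x) - sigma * lam])
    sol = np.linalg.solve(K, rhs)
    return x + sol[:n], sol[n:]

x, lam = np.array([2.0, -1.0]), np.zeros(2)
for _ in range(10):
    # the natural residual of the KKT system drives the stabilization parameter
    sigma = np.linalg.norm(np.concatenate([f_grad(x) + A.T @ lam, h(x)]))
    x, lam = ssqp_step(x, lam, max(sigma, 1e-12))

print(np.round(x, 6))                # close to the minimizer (0.5, 0.5)
```

Note that without the `-sigma * I` block the KKT matrix would be singular here; the stabilization term keeps each subproblem well-posed, and the multiplier iterates settle on one particular element of the (non-unique) multiplier set.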
Similar papers
Stabilized sequential quadratic programming for optimization and a stabilized Newton-type method for variational problems
The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve fast convergence despite possible degeneracy of the constraints of optimization problems, when the Lagrange multipliers associated with a solution are not unique. Superlinear convergence of sSQP had previously been established under the strong second-order sufficient condition for op...
A quasi-Newton strategy for the sSQP method for variational inequality and optimization problems
The quasi-Newton strategy presented in this paper preserves one of the most important features of the stabilized Sequential Quadratic Programming method: local convergence without constraint qualification assumptions. It is known that the primal-dual sequence converges quadratically assuming only the second-order sufficient condition. In this work, we show that if the matrices are updated ...
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
Stabilized Sequential Quadratic Programming
Recently, Wright proposed a stabilized sequential quadratic programming algorithm for inequality constrained optimization. Assuming the Mangasarian-Fromovitz constraint qualification and the existence of a strictly positive multiplier (but possibly dependent constraint gradients), he proved a local quadratic convergence result. In this paper, we establish quadratic convergence in cases where bo...
Inexact Josephy–Newton framework for variational problems and its applications to optimization
We propose and analyze a perturbed version of the classical Josephy-Newton method for solving generalized equations, and of the sequential quadratic programming method for optimization problems. This perturbed framework is convenient to treat in a unified way standard sequential quadratic programming, its stabilized version [9, 2], sequential quadratically constrained quadratic programming [1, 4...
Publication date: 2007